2.9 Low Flow (7Q10) Data
The 2024 IR is the first assessment window to use a standardized low flow analysis process. Information presented below is still under review by regional assessment staff.
The workflow first calculates the 7Q10 low flow statistic for all available gages in Virginia based on the last 50 years of water data. The functions used to calculate the xQy flow statistics are identical to DEQ’s Water Permitting protocols and were written by Connor Brogan. The water data are provided by the USGS NWIS data repository and are retrieved inside the xQy function by the USGS dataRetrieval package. Important assumptions of the xQy program are identified below.
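The core idea behind a 7Q10 statistic can be sketched on synthetic data. This is only an illustration, not the DEQ xQy/DFLOW implementation: the flows and variable names below are made up, and DFLOW fits a log-Pearson Type III distribution to the annual minima where this sketch substitutes a crude empirical quantile.

```r
library(zoo)

set.seed(1)
# ten synthetic "water years" (April 1 - March 31) of daily flow in cfs
flow <- data.frame(Date = seq(as.Date("2013-04-01"), as.Date("2023-03-31"), by = "day"))
flow$Flow <- 50 + 30 * sin(2 * pi * as.numeric(format(flow$Date, "%j")) / 365) +
  rnorm(nrow(flow), sd = 5)
# assign each day to a water year beginning April 1
flow$WY <- as.numeric(format(flow$Date, "%Y")) - (format(flow$Date, "%m") < "04")

# 7-day trailing mean, then the minimum 7-day flow in each water year
flow$roll7 <- zoo::rollmean(flow$Flow, k = 7, fill = NA, align = "right")
annMin <- aggregate(roll7 ~ WY, data = flow, FUN = min)

# crude 7Q10 stand-in: the flow with roughly a 1-in-10-year recurrence among
# the annual 7-day minima (DFLOW fits a distribution instead of a quantile)
x7Q10_sketch <- quantile(annMin$roll7, probs = 0.10, names = FALSE)
```

Daily flows below this value would be candidates for a low flow flag.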
Once flow statistics are generated for all available gages statewide, they are compared to the observed flow data within a given assessment window. Any gage with daily flows below its 7Q10 statistic is flagged for the appropriate time period. This information is spatially joined to the assessment watersheds (VAHU6) to extrapolate available flow data to areas without gaging stations. This is not an ideal extrapolation of flow data, but it serves as a useful initial flag telling assessors when and where to investigate further.
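The gage-to-watershed extrapolation amounts to a point-in-polygon spatial join. A minimal sketch with sf, using toy square polygons and toy gage points in place of the real VAHU6 layer and gage list (all names here are hypothetical):

```r
library(sf)
library(dplyr)

# two toy "VAHU6" polygons standing in for the assessment watershed layer
vahu6_toy <- st_sf(
  VAHU6 = c("JM01", "JM02"),
  geometry = st_sfc(
    st_polygon(list(rbind(c(0, 0), c(1, 0), c(1, 1), c(0, 1), c(0, 0)))),
    st_polygon(list(rbind(c(1, 0), c(2, 0), c(2, 1), c(1, 1), c(1, 0)))),
    crs = 4326))

# two toy gage points, one with a low flow flag
gages_toy <- st_sf(
  site_no = c("gage1", "gage2"),           # hypothetical gage IDs
  lowFlowFlag = c("7Q10 Flag", NA),
  geometry = st_sfc(st_point(c(0.5, 0.5)), st_point(c(1.5, 0.5)), crs = 4326))

# point-in-polygon join: each gage inherits the VAHU6 it falls inside, so a
# gage's low flow flag can be extrapolated to its whole watershed
flaggedVAHU6 <- st_join(gages_toy, vahu6_toy, join = st_within) %>%
  st_drop_geometry() %>%
  filter(!is.na(lowFlowFlag)) %>%
  distinct(VAHU6)
```

Here only the watershed containing the flagged gage survives the filter.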
These temporal low flow flags are joined to individual site monitoring data by VAHU6 and VAHU5 during the automated assessment process. If parameters used to assess aquatic life condition are collected during low flow periods, then the data are flagged inside the assessment applications, indicating further review is necessary prior to accepting the automated assessment exceedance calculations for that site.
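The flag join described above can be sketched with dplyr. The station IDs, watershed codes, and low flow window below are made up for illustration; the real join happens inside the automated assessment scripts.

```r
library(dplyr)

# toy monitoring data with a VAHU6 assignment and a sample date
stationData <- tibble(
  StationID  = c("station1", "station2"),   # hypothetical station IDs
  VAHU6      = c("JM01", "JM05"),
  SampleDate = as.Date(c("2019-08-15", "2019-08-15")))

# toy low flow window for a single watershed
lowFlowWindows <- tibble(
  VAHU6       = "JM01",
  windowStart = as.Date("2019-08-01"),
  windowEnd   = as.Date("2019-08-31"))

# join windows by watershed and flag samples collected inside a window
flaggedData <- stationData %>%
  left_join(lowFlowWindows, by = "VAHU6") %>%
  mutate(`Low Flow Flag` = case_when(
    between(SampleDate, windowStart, windowEnd) ~ "Low Flow",
    TRUE ~ NA_character_))
```

Flagged rows would then trigger the extra review step inside the assessment applications.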
2.9.1 7Q10 Method
The method for identifying low flow information for the assessment period is detailed below. The DFLOW_CoreFunctions_EVJ.R script is an assessment-specific adaptation of Connor Brogan’s xQy protocols that allows minor adjustments to the DFLOW procedure for assessment purposes (for more information on these changes, see the Important 7Q10 Calculation Notes section below).
This analysis needs to be performed on or after April 2 of the assessment window cutoff year to ensure the entire final water year is included in the analysis. The results are posted on the R server for inclusion in the automated assessment methods.
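A simple guard (a sketch, not part of the DEQ scripts) can encode that timing rule before the analysis runs:

```r
# the analysis must run on or after April 2 so that the final water year
# (ending March 31) is fully represented in NWIS
windowEnd <- as.Date("2023-03-31") # example final water year end for the cutoff
runDate   <- as.Date("2023-04-02") # example run date; use Sys.Date() in practice

if (runDate < windowEnd + 2) {
  stop("Run this analysis on or after April 2 so the final water year is complete.")
}
```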
library(tidyverse)
library(zoo)
library(dataRetrieval)
library(e1071)
library(sf)
library(leaflet)
library(inlmisc)
library(DT)
source('DFLOW_CoreFunctions_EVJ.R')

2.9.2 USGS Site Data Gathering
Information for all USGS gages sampled in the last 50 years needs to be collected from USGS NWIS. We can use the whatNWISsites() function to identify which sites have daily discharge data (parameter code 00060) in a designated area (stateCd = ‘VA’).
sites <- whatNWISsites(stateCd="VA",
parameterCd="00060",
hasDataTypeCd="dv") %>%
filter(site_tp_cd %in% c('ST', 'SP')) # only keep ST (stream) and SP (spring) sites
sites_sf <- sites %>%
st_as_sf(coords = c("dec_long_va", "dec_lat_va"),
remove = F, # don't remove these lat/lon cols from df
crs = 4326) # add projection; needs to be geographic for now because entering lat/lng

Now we will pull daily flow data for each site identified and calculate 7Q10. This is saved in the local environment as a list object, with each gage a unique list element.
# store it somewhere
flowAnalysis <- list()
for(i in unique(sites$site_no)){
print(i)
siteFlow <- xQy_EVJ(gageID = i,             # USGS gage ID
                    DS = "1972-03-31",      # lower limit of the USGS gage data download (yyyy-mm-dd)
                    DE = "2023-04-01",      # upper limit of the USGS gage data download (yyyy-mm-dd)
                    WYS = "04-01",          # start of the analysis season (mm-dd); defaults to April 1
                    WYE = "03-31",          # end of the analysis season (mm-dd); defaults to March 31
                    x = 7,                  # x of the xQy statistic, if different from the default
                    y = 10,                 # y of the xQy statistic
                    onlyUseAcceptedData = F)
flowAnalysis[[i]] <- siteFlow
}

Using the purrr library, we can extract just the flow metric information for each gage and store it in a tibble for later use.
# extract 7Q10 by gageNo
x7Q10 <- map_df(flowAnalysis, "Flows") # EVJ added in gageNo to xQy_EVJ()
x7Q10[1:50,] %>% # preview first 50 rows
DT::datatable(rownames = F, options = list(dom = 'lftip', pageLength = 5, scrollX = TRUE))

And we need the actual daily flow data to compare to the low flow metrics, so we will extract that next.
# now to extract flow data already pulled by function
flows <- map_df(flowAnalysis, "outdat")
flows[1:50,] %>% # preview first 50 rows
DT::datatable(rownames = F, options = list(dom = 'lftip', pageLength = 5, scrollX = TRUE))

Since we only really care about flow data from our assessment window, let’s extract just the flow data and filter to our IR window of interest. We can then join the low flow metrics by gage number and flag any daily average flow data that falls below the gage’s 7Q10 metric.
# now just grab flow data in assessment window, join in 7Q10, identify any measures below 7Q10
assessmentFlows <- map_df(flowAnalysis, "outdat") %>%
filter(between(Date, as.Date("2017-01-01"), as.Date("2022-12-31"))) %>%
left_join(x7Q10, by = c('Gage ID' = "gageNo")) %>%
mutate(`7Q10 Flag` = case_when(Flow <= n7Q10 ~ '7Q10 Flag',
TRUE ~ as.character(NA)))
assessmentFlows[1:50,] %>% # preview first 50 rows
DT::datatable(rownames = F, options = list(dom = 'lftip', pageLength = 5, scrollX = TRUE))

Here we limit our assessmentFlows object to just the rows where a 7Q10 flag is encountered. We can review these low flow events by organizing them by gage and date.
# anything below 7Q10?
lowAssessmentFlows <- filter(assessmentFlows, `7Q10 Flag` == '7Q10 Flag')
unique(lowAssessmentFlows$`Gage ID`) # what gages do these occur at?
# organize low flow events by Gage ID
(lowAssessmentFlows %>%
  arrange(`Gage ID`, Date))[1:50,] %>% # preview first 50 rows
  DT::datatable(rownames = F, options = list(dom = 'lftip', pageLength = 5, scrollX = TRUE))

Next, let’s review the low flow gages visually on a map. First, we need to transform this low flow information into a spatial object.
# see where spatially
lowFlowSites <- lowAssessmentFlows %>%
distinct(`Gage ID`) %>%
left_join(sites_sf, by = c('Gage ID' = 'site_no')) %>%
st_as_sf()

We will bring in assessment watersheds to better understand how these low flow events happen across the landscape.
vahu6 <- st_read('../data/GIS/VA_SUBWATERSHED_6TH_ORDER_STG.shp') # this version of vahu6 layer goes outside state boundary
vahu5 <- vahu6 %>%
group_by(VAHU5) %>%
summarise()

And here is a map of the assessment watersheds (VAHU5 and VAHU6) with all Virginia USGS gages (USGS sites) and just USGS gages with low flow events in the IR window (Low Flow USGS sites).
CreateWebMap(maps = c("Topo","Imagery","Hydrography"), collapsed = TRUE,
options= leafletOptions(zoomControl = TRUE,minZoom = 5, maxZoom = 20,
preferCanvas = TRUE)) %>%
setView(-79.1, 37.7, zoom=7) %>%
addCircleMarkers(data = sites_sf, color='gray', fillColor='gray', radius = 4,
fillOpacity = 0.8,opacity=0.8,weight = 2,stroke=T, group="USGS sites",
label = ~site_no) %>%
addCircleMarkers(data = lowFlowSites, color='gray', fillColor='red', radius = 4,
fillOpacity = 0.8,opacity=0.8,weight = 2,stroke=T, group="Low Flow USGS sites",
label = ~`Gage ID`) %>%
addPolygons(data= vahu5, color = 'black', weight = 1,
fillColor= 'blue', fillOpacity = 0.5,stroke=0.1,
group="vahu5", label = ~VAHU5) %>% hideGroup("vahu5") %>%
addPolygons(data= vahu6, color = 'black', weight = 1,
fillColor= 'blue', fillOpacity = 0.5,stroke=0.1,
group="vahu6", label = ~VAHU6) %>% hideGroup("vahu6") %>%
addLayersControl(baseGroups=c("Topo","Imagery","Hydrography"),
overlayGroups = c("Low Flow USGS sites","USGS sites","vahu5","vahu6"),
options=layersControlOptions(collapsed=T),
position='topleft')